Improving the tolerance of multilayer perceptrons by minimizing the statistical sensitivity to weight deviations

Authors

  • José Luis Bernier
  • Julio Ortega
  • Ignacio Rojas
  • Alberto Prieto
Abstract

This paper proposes a version of the backpropagation algorithm which increases the tolerance of a feedforward neural network against deviations in the weight values. These changes can originate either when the neural network is mapped on a given VLSI circuit where the precision and/or weight matching are low, or by physical defects affecting the neural circuits. The modified backpropagation algorithm we propose uses the statistical sensitivity of the network to changes in the weights as a quantitative measure of network tolerance and attempts to reduce this statistical sensitivity while keeping the figures for the usual training performance (in errors and time) similar to those obtained with the usual backpropagation algorithm. © 2000 Elsevier Science B.V. All rights reserved.
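As a rough illustration of the idea, the sketch below trains a single sigmoid neuron on a toy problem using a loss that adds a penalty approximating the output's sensitivity to small weight perturbations. The single-neuron setting, the penalty weight `lam`, and the exact penalty form are assumptions for the demo, not the paper's formulation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, t, lam=0.05, lr=1.0, epochs=5000, seed=0):
    """Gradient descent on MSE + lam * S, where S = sum_j (dy/dw_j)^2
    approximates the output variance under small i.i.d. weight noise."""
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=X.shape[1])
    n = len(t)
    for _ in range(epochs):
        y = sigmoid(X @ w)
        # gradient of the mean squared error term
        g_mse = X.T @ ((y - t) * y * (1 - y)) / n
        # for a sigmoid neuron, S per sample = y^2 (1-y)^2 * ||x||^2;
        # the chain rule gives its gradient with respect to w:
        xnorm2 = np.sum(X ** 2, axis=1)
        g_sens = X.T @ (2 * y ** 2 * (1 - y) ** 2 * (1 - 2 * y) * xnorm2) / n
        w -= lr * (g_mse + lam * g_sens)
    return w

# Toy data: logical OR with a constant bias input
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], float)
t = np.array([0, 1, 1, 1], float)
w = train(X, t)
pred = (sigmoid(X @ w) > 0.5).astype(int)
```

Because y(1-y) shrinks as outputs saturate, the penalty also pushes activations toward their flat regions, which is one intuition for why sensitivity-aware training can coexist with ordinary classification accuracy.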


Similar articles

An Accurate Measure for Multilayer Perceptron Tolerance to Additive Weight Deviations

The inherent fault tolerance of artificial neural networks (ANNs) is usually assumed, but several authors have claimed that ANNs are not always fault tolerant and have demonstrated the need to evaluate their robustness by quantitative measures. For this purpose, various alternatives have been proposed. In this paper we show the direct relation between the mean square error (MSE) and the statist...


Training Multilayer Perceptrons Via Minimization of Sum of Ridge Functions

Motivated by the problem of training multilayer perceptrons in neural networks, we consider the problem of minimizing E(x) = Σ_{i=1}^{n} f_i(ξ_i · x), where ξ_i ∈ R^s, 1 ≤ i ≤ n, and each f_i(ξ_i · x) is a ridge function. We show that when n is small the problem of minimizing E can be treated as one of minimizing univariate functions, and we use the gradient algorithms for minimizing E when n is moderately la...
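A hedged sketch of this setting, taking each ridge as f_i(s) = (s - b_i)^2 (an assumed choice that reduces E to least squares) and minimizing E by plain fixed-step gradient descent:

```python
import numpy as np

# Minimize E(x) = sum_i f_i(xi_i . x) with the assumed ridges
# f_i(s) = (s - b_i)^2, so grad E(x) = sum_i 2 (xi_i . x - b_i) xi_i.
def grad_E(x, Xi, b):
    return Xi.T @ (2 * (Xi @ x - b))

rng = np.random.default_rng(1)
Xi = rng.normal(size=(20, 3))        # ridge directions xi_i in R^3
x_true = np.array([1.0, -2.0, 0.5])
b = Xi @ x_true                      # offsets chosen so the minimum is x_true

x = np.zeros(3)
for _ in range(2000):
    x -= 0.01 * grad_E(x, Xi, b)     # fixed-step gradient descent
```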


IDIAP Technical report

Proper initialization is one of the most important prerequisites for fast convergence of feed-forward neural networks like high order and multilayer perceptrons. This publication aims at determining the optimal value of the initial weight variance (or range), which is the principal parameter of random weight initialization methods for both types of neural networks. An overview of random weight...
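A minimal sketch of variance-controlled random initialization; the 1/fan_in default used here is a common heuristic, not necessarily the optimum the report derives.

```python
import numpy as np

def init_weights(fan_in, fan_out, variance=None, seed=0):
    """Draw a Gaussian weight matrix with a chosen variance.
    Defaults to 1/fan_in so pre-activations stay O(1) in scale."""
    rng = np.random.default_rng(seed)
    if variance is None:
        variance = 1.0 / fan_in
    return rng.normal(0.0, np.sqrt(variance), size=(fan_in, fan_out))

W = init_weights(100, 50)   # e.g. a layer with 100 inputs and 50 units
```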


Overcoming the Local-Minimum Problem in Training Multilayer Perceptrons with the NRAE-MSE Training Method

A method of training multilayer perceptrons (MLPs) to reach a global or nearly global minimum of the standard mean squared error (MSE) criterion is proposed. It has been found that the region in the weight space that does not have a local minimum of the normalized risk-averting error (NRAE) criterion expands strictly to the entire weight space as the risk-sensitivity index increases to infinity....


Steganalysis of embedding in difference of image pixel pairs by neural network

In this paper a steganalysis method is proposed for the pixel-value-differencing steganographic method. This method, which has resisted conventional attacks, performs the embedding in the difference of the values of pixel pairs. Therefore, the histogram of the differences of an embedded image is different from that of a cover image. A number of characteristics are identified in the di...



Journal:
  • Neurocomputing

Volume 31, Issue -

Pages -

Publication year: 2000